Results 1 - 20 of 23
1.
Sci Rep ; 14(1): 6795, 2024 Mar 21.
Article in English | MEDLINE | ID: mdl-38514669

ABSTRACT

Industrial growth, heavy consumption of fossil fuels, vehicle emissions, and related factors have drastically worsened the Air Quality Index (AQI) of major cities. Analysing the AQI of major cities is essential so that governments can take proper preventive and proactive measures to reduce air pollution. This research applies artificial intelligence to AQI prediction based on air pollution data, proposing an optimized machine learning model that combines Grey Wolf Optimization (GWO) with the Decision Tree (DT) algorithm for accurate prediction of AQI in major Indian cities. Air quality data from the Kaggle repository is used for experimentation, and major cities such as Delhi, Hyderabad, Kolkata, Bangalore, Visakhapatnam, and Chennai are considered for analysis. The performance of the proposed model is verified experimentally through metrics such as R-squared, RMSE, MSE, MAE, and accuracy, and compared against existing machine learning models such as k-nearest neighbors, random forest regression, and support vector regression. The proposed model attains better prediction performance than traditional machine learning algorithms, with maximum accuracies of 88.98% for New Delhi, 91.49% for Bangalore, 94.48% for Kolkata, 97.66% for Hyderabad, 95.22% for Chennai, and 97.68% for Visakhapatnam.
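The paper does not publish its code; as an illustration of the general idea only, the sketch below wraps a minimal Grey Wolf Optimization loop around a scikit-learn decision tree to tune its depth and split threshold for AQI regression. The file name, feature columns, and search bounds are assumptions, not details taken from the paper.

```python
# Hypothetical sketch: Grey Wolf Optimization tuning a decision-tree AQI regressor.
# Column names, bounds, and the CSV path are assumptions for illustration only.
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

df = pd.read_csv("city_air_quality.csv")          # assumed Kaggle-style export
X = df[["PM2.5", "PM10", "NO2", "SO2", "CO", "O3"]].values
y = df["AQI"].values

# Each wolf encodes (max_depth, min_samples_split) as continuous values.
bounds = np.array([[2.0, 30.0], [2.0, 50.0]])

def fitness(wolf):
    depth, min_split = int(round(wolf[0])), int(round(wolf[1]))
    model = DecisionTreeRegressor(max_depth=depth, min_samples_split=min_split, random_state=0)
    # cross_val_score returns negative MSE; flip the sign so GWO minimizes the MSE.
    return -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

def gwo(n_wolves=10, n_iter=30):
    rng = np.random.default_rng(0)
    wolves = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_wolves, 2))
    for t in range(n_iter):
        scores = np.array([fitness(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(scores)[:3]]
        a = 2 - 2 * t / n_iter                     # exploration factor decays linearly
        for i in range(n_wolves):
            new_pos = np.zeros(2)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(2), rng.random(2)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += (leader - A * D) / 3.0  # average of the three leader updates
            wolves[i] = np.clip(new_pos, bounds[:, 0], bounds[:, 1])
    scores = np.array([fitness(w) for w in wolves])
    return wolves[np.argmin(scores)]

best = gwo()
print("best max_depth, min_samples_split:", int(round(best[0])), int(round(best[1])))
```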

2.
Sci Rep ; 14(1): 7520, 2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38553492

ABSTRACT

Water consumption underpins the physical health of most living species, so managing its purity and quality is essential: contaminated water has the potential to cause adverse health and environmental consequences. This creates a dire necessity to measure, control, and monitor water quality. The primary contaminant in water is Total Dissolved Solids (TDS), which is hard to filter out, and water also carries substances such as potassium, sodium, chlorides, lead, nitrate, cadmium, arsenic, and other pollutants. The proposed work automates water quality estimation through artificial intelligence and uses Explainable Artificial Intelligence (XAI) to explain the parameters that contribute most to water potability and to the estimation of impurities. XAI provides the transparency and justifiability of a white-box model, whereas a Machine Learning (ML) model is a black box that cannot describe the reasoning behind its classification. The proposed work uses several ML models, namely Logistic Regression, Support Vector Machine (SVM), Gaussian Naive Bayes, Decision Tree (DT), and Random Forest (RF), to classify whether water is drinkable. XAI representations such as the force plot, test patch, summary plot, dependency plot, and decision plot generated with the SHAP explainer expose the significant features, prediction scores, feature importances, and the justification behind the water quality estimate. The RF classifier is selected for explanation and yields an optimum accuracy and F1-score of 0.9999, with precision of 0.9997 and recall of 0.998. The work is thus an exploratory analysis of the estimation and management of water quality, with indicators ranked by their significance, and is intended to address water quality needs both now and in the future.
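For readers who want a concrete picture of the explainability step, the following sketch trains a random-forest potability classifier and generates SHAP summary and dependence plots. The water_potability.csv file name and its column names follow the common Kaggle layout and are assumptions rather than details from the paper.

```python
# Hypothetical sketch: SHAP explanations for a random-forest potability classifier.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("water_potability.csv").dropna()      # assumed Kaggle-style file
X, y = df.drop(columns=["Potability"]), df["Potability"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X_test)

# Older SHAP versions return a list of per-class arrays; newer ones a single 3-D array.
sv_pos = shap_values[1] if isinstance(shap_values, list) else shap_values[:, :, 1]

# Global view: which features (e.g., Solids/TDS, ph) push predictions toward "potable".
shap.summary_plot(sv_pos, X_test)

# Dependence view for one assumed column name from the Kaggle layout.
shap.dependence_plot("Solids", sv_pos, X_test)
```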

3.
Sci Rep ; 14(1): 2820, 2024 02 03.
Article in English | MEDLINE | ID: mdl-38307901

ABSTRACT

This paper proposes and implements a deep-learning-based image processing approach for autonomous apple picking. The system includes a lightweight one-stage detection network for fruit recognition, together with computer vision that analyses the picking point class and anticipates a correct approach position for each fruit before grasping. Using raw inputs from a high-resolution camera, fruit recognition and instance segmentation are performed on RGB photos. The computer vision classification and grasping systems are integrated: detections of tree-grown fruit serve as the input, and the output of the method is a grasping pose for every apple and orange, passed to the robotic arm for execution. The developed vision method is evaluated on RGB picture data acquired from laboratory and plantation environments, and robot harvesting experiments are conducted both indoors and outdoors to evaluate the proposed harvesting system's performance. The findings suggest that the proposed vision technique can control robotic harvesting effectively and precisely, with an identification success rate above 95% after the post-prediction process and a reattempt rate of less than 12%.


Subjects
Robotics; Fruit; Image Processing, Computer-Assisted; Hand Strength; Vision, Ocular
4.
Sci Rep ; 14(1): 4947, 2024 Feb 28.
Article in English | MEDLINE | ID: mdl-38418484

ABSTRACT

The Internet of Things (IoT) paves the way for modern smart industrial applications and smart cities. A Trusted Authority acts as the sole controller that monitors and maintains communication between IoT devices and the infrastructure, and communication between IoT devices proceeds from one trusted entity of an area to another through the generation of security certificates. Establishing trust by generating security certificates for the IoT devices in a smart city application, however, can be very expensive. To address this, a secure group authentication scheme that creates trust amongst a group of IoT devices owned by several entities is proposed. Most existing authentication techniques are designed for individual device authentication and are merely reused for group authentication; in contrast, the Dickson polynomial based secure group authentication scheme is a dedicated solution for group authentication. The secret keys used in the proposed technique are generated using the Dickson polynomial, which enables the group to authenticate without generating excessive network traffic overhead. Blockchain technology is employed to enable secure, efficient, and fast data transfer among the IoT devices of each group deployed at different places. The proposed Dickson polynomial based scheme is resistant to replay, man-in-the-middle, tampering, side-channel, signature forgery, impersonation, and ephemeral secret key leakage attacks; to accomplish this, a hardware-based physically unclonable function is implemented. The scheme is implemented in Python and deployed and tested on blockchain using the Ethereum Goerli testnet framework. Performance analysis against various benchmarks shows that the proposed framework outperforms its counterparts across several metrics, and further evaluation of the blockchain framework shows better performance in terms of computation, communication, storage, and latency.
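The abstract does not spell out how the Dickson polynomial keys are derived; as an illustration only, the sketch below evaluates the Dickson polynomial D_n(x, 1) over a prime field and uses its composition property D_m(D_n(x, 1), 1) = D_mn(x, 1) to derive shared key material in a Diffie-Hellman-like way. The modulus, base value, and secret indices are hypothetical, and no claim is made that this matches the paper's scheme.

```python
# Hypothetical sketch: shared-key derivation from Dickson polynomials D_n(x, 1) mod p.
P = 2_147_483_647          # public prime modulus (assumed)
X0 = 123_456_789           # public base value shared by the group (assumed)

def dickson(n, x, p):
    """Evaluate the Dickson polynomial D_n(x, 1) mod p via its linear recurrence."""
    d_prev, d_curr = 2 % p, x % p          # D_0 = 2, D_1 = x
    if n == 0:
        return d_prev
    for _ in range(n - 1):
        d_prev, d_curr = d_curr, (x * d_curr - d_prev) % p
    return d_curr

# Each group member keeps a secret index and publishes D_secret(X0, 1) mod p.
secret_a, secret_b = 92_341, 47_119        # illustrative private values
pub_a = dickson(secret_a, X0, P)
pub_b = dickson(secret_b, X0, P)

# Both sides derive the same value D_{ab}(X0, 1) mod p without ever exchanging it.
shared_a = dickson(secret_a, pub_b, P)
shared_b = dickson(secret_b, pub_a, P)
assert shared_a == shared_b
print("shared group key material:", shared_a)
```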

5.
Sci Rep ; 14(1): 843, 2024 01 08.
Article in English | MEDLINE | ID: mdl-38191643

ABSTRACT

Detection and classification of epileptic seizures from EEG signals has gained significant attention in recent decades: among other signals, EEG is extensively used by medical experts for diagnosis, and most existing research develops automated mechanisms for EEG-based epileptic seizure detection. Machine learning techniques are widely used for their reduced time consumption, high accuracy, and good performance, but they remain limited by high algorithm-design complexity, increased error, and reduced detection efficacy. The proposed work therefore develops an automated epileptic seizure detection system with an improved performance rate. The Finite Linear Haar wavelet-based Filtering (FLHF) technique filters the input signals, and a relevant set of features is extracted from the normalized output with the help of Fractal Dimension (FD) analysis. The Grasshopper Bio-Inspired Swarm Optimization (GBSO) technique then selects the optimal features by computing the best fitness value, and the Temporal Activation Expansive Neural Network (TAENN) mechanism classifies the EEG signals as normal or seizure affected. The literature applies numerous intelligent algorithms for preprocessing, optimization, and classification to identify epileptic seizures from EEG signals; the primary issues facing most optimization approaches are reduced convergence rates and higher computational complexity, while machine learning approaches suffer from significant method complexity, intricate mathematical calculations, and decreased training speed. The goal of the proposed work is therefore to put efficient algorithms for the recognition and categorization of epileptic seizures into practice. The combined effect of the proposed FLHF, FD, GBSO, and TAENN models markedly improves detection accuracy while decreasing system complexity and time consumption relative to prior techniques: the overall average epileptic seizure detection performance reaches 99.6%, with an F-measure of 99% and a G-mean of 98.9%.
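As a rough picture of the signal-processing front end, the sketch below applies Haar-wavelet filtering to a synthetic EEG trace and computes a Higuchi fractal-dimension feature. It is a generic stand-in, not the paper's FLHF/FD pipeline; the synthetic signal, decomposition level, and threshold rule are assumptions.

```python
# Hypothetical sketch: Haar-wavelet denoising plus a fractal-dimension feature for one channel.
import numpy as np
import pywt

rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * np.linspace(0, 4, 1024)) + 0.5 * rng.standard_normal(1024)

# Haar decomposition, soft-threshold the detail coefficients, then reconstruct.
coeffs = pywt.wavedec(eeg, "haar", level=4)
threshold = 0.3 * np.max(np.abs(coeffs[-1]))
coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
filtered = pywt.waverec(coeffs, "haar")

def higuchi_fd(signal, k_max=10):
    """Higuchi's estimate of the fractal dimension of a 1-D signal."""
    n = len(signal)
    lengths = []
    for k in range(1, k_max + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(signal[idx])).sum()
            lk.append(dist * (n - 1) / ((len(idx) - 1) * k * k))
        lengths.append(np.mean(lk))
    # Slope of log(L(k)) against log(1/k) gives the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lengths), 1)
    return slope

print("fractal dimension of filtered EEG:", round(higuchi_fd(filtered), 3))
```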


Subjects
Epilepsy; Grasshoppers; Animals; Seizures/diagnosis; Epilepsy/diagnosis; Neural Networks, Computer; Electroencephalography
6.
Sci Rep ; 14(1): 386, 2024 Jan 03.
Article in English | MEDLINE | ID: mdl-38172185

ABSTRACT

The Internet of Things (IoT) is extensively used in modern life, for example in smart homes and intelligent transportation. However, present security measures cannot fully protect the IoT because of its vulnerability to malicious attacks, and intrusion detection is the security tool that can shield IoT devices from the most harmful of them. Nevertheless, the detection accuracy and time efficiency of conventional intrusion detection methods still need to improve. The main contribution of this paper is a simple yet intelligent security framework for protecting the IoT from cyber-attacks, combining Decisive Red Fox (DRF) Optimization with Descriptive Back Propagated Radial Basis Function (DBRF) classification. The novelty of this work is that a recently developed DRF optimization methodology is incorporated with a machine learning algorithm to maximize the security level of IoT systems. First, data preprocessing and normalization operations generate a balanced IoT dataset that improves classification accuracy. The DRF optimization algorithm is then applied to optimally tune the features required for accurate intrusion detection and classification, which also increases training speed and reduces the classifier's error rate. Finally, the DBRF classification model categorizes normal and attack data flows using the optimized features. The proposed DRF-DBRF security model is validated and tested on five different, widely used IoT benchmarking datasets, and the results are compared with previous anomaly detection approaches using various evaluation parameters.

7.
Sci Rep ; 14(1): 2487, 2024 01 30.
Article in English | MEDLINE | ID: mdl-38291130

ABSTRACT

Pneumonia is a widespread and acute respiratory infection that affects people of all ages. Early detection and treatment are essential for avoiding complications and improving clinical outcomes; effective detection methods can reduce mortality, improve healthcare efficiency, and contribute to the global fight against a disease that has plagued humanity for centuries. Detecting pneumonia is therefore not only a medical necessity but also a humanitarian imperative and a technological frontier. Chest X-rays are a frequently used imaging modality for diagnosing pneumonia. This paper examines in detail a pneumonia detection method built on the Vision Transformer (ViT) architecture and evaluated on a public chest X-ray dataset available on Kaggle. To capture global context and spatial relationships from chest X-ray images, the proposed framework deploys the ViT model, which integrates self-attention mechanisms into the transformer architecture. In our experiments, the proposed Vision Transformer-based framework achieves an accuracy of 97.61%, sensitivity of 95%, and specificity of 98% in detecting pneumonia from chest X-rays. The ViT model is well suited to capturing global context, comprehending spatial relationships, and processing images of different resolutions, and the framework establishes its efficacy as a robust pneumonia detection solution by surpassing convolutional neural network (CNN) based architectures.
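As an illustration of the kind of pipeline described, the sketch below fine-tunes a pretrained ViT-B/16 from torchvision on a chest X-ray folder. The directory layout, hyperparameters, and backbone choice are assumptions; the paper does not state its implementation details.

```python
# Hypothetical sketch: fine-tuning a pretrained ViT-B/16 on train/{NORMAL,PNEUMONIA} folders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),   # X-rays are single-channel
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("chest_xray/train", transform=tfm)   # assumed layout
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=2)

model = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
model.heads.head = nn.Linear(model.heads.head.in_features, 2)        # NORMAL vs PNEUMONIA
model = model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```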


Subjects
Pneumonia; Respiratory Tract Infections; Humans; X-Rays; Pneumonia/diagnostic imaging; Humanities; Radiography
8.
Math Biosci Eng ; 20(12): 20828-20851, 2023 Nov 20.
Article in English | MEDLINE | ID: mdl-38124578

ABSTRACT

The security of the Internet of Things (IoT) is crucial in various application platforms, such as smart city monitoring systems that comprehensively monitor many conditions. This study therefore analyses the use of blockchain technology for monitoring IoT systems, with the analysis carried out through parametric objective functions. In an IoT setting it is imperative to establish well-defined intervals for job execution, so that the completion status of each action is promptly monitored and assessed. The major significance of the proposed method is that it integrates a blockchain technique with a neuro-fuzzy algorithm, thereby improving the security of the data processing units in all smart city applications. Because the entire process is carried out over IoT, data in the processing and storage units is not inherently secure, so the confidence level of the monitoring units is maximized at each state. Owing to this integration, the proposed system model operates with minimal energy consumption, completing 93% of tasks while improving security by about 90%.

9.
Sci Rep ; 13(1): 23041, 2023 Dec 27.
Article in English | MEDLINE | ID: mdl-38155207

ABSTRACT

Unmanned aerial vehicles (UAVs) have become a promising enabler for the next generation of wireless networks, given the tremendous growth in electronics and communications. Applications of UAV communications include extending coverage for transmission networks after disasters, serving Internet of Things (IoT) devices, and relaying distress messages from devices positioned within coverage holes to emergency centres. However, enhancing UAV clustering and scene classification with deep learning approaches raises several problems. This article presents a new White Shark Optimizer with Optimal Deep Learning based Effective Unmanned Aerial Vehicles Communication and Scene Classification (WSOODL-UAVCSC) technique. UAV clustering and scene categorization present many deep learning challenges in disaster management: scene-understanding complexity, data variability and abundance, visual feature extraction, nonlinear and high-dimensional data, adaptability and generalization, real-time decision making, UAV clustering optimization, and sparse or incomplete data. The need to handle complex, high-dimensional data, adapt to changing environments, and make quick, correct decisions in critical situations is what drives deep learning in UAV clustering and scene categorization. The purpose of the WSOODL-UAVCSC technique is to cluster the UAVs for effective communication and scene classification. The WSO algorithm optimizes the UAV clustering process and enables effective communication and interaction in the network; by dynamically adjusting the clustering, it improves the performance and robustness of the UAV system. For scene classification, the WSOODL-UAVCSC technique combines capsule network (CapsNet) feature extraction, marine predators algorithm (MPA) based hyperparameter tuning, and echo state network (ESN) classification. A wide-ranging simulation analysis validates the enriched performance of the WSOODL-UAVCSC approach: extensive result analysis shows that it outperforms other existing techniques, achieving an accuracy of 99.12%, precision of 97.45%, recall of 98.90%, and F1-score of 98.10%.

10.
Sci Rep ; 13(1): 20843, 2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38012161

ABSTRACT

The Internet of Things (IoT) involves gathering all devices that connect to the Internet for the purpose of collecting and sharing data, and its application in different sectors, including health and industry, has grown rapidly over the past few years. The IoT and, by extension, the Industrial IoT (IIoT) are highly susceptible to different types of threats and attacks owing to the nature of their networks, which in turn leads to poor outcomes such as increasing error rates. It is therefore critical to design attack detection systems that can secure IIoT networks. Because existing IIoT attack detection work fails to identify certain attacks and thus yields poor detection performance, a reinforcement-learning-based attack detection method called sliding principal component and dynamic reward reinforcement learning (SPC-DRRL) is introduced for detecting various IIoT network attacks. In the first stage of the methodology, the raw TON_IoT dataset is preprocessed with a min-max normalization scaling function to obtain values on the same scale. Next, to extract data from multiple sources (i.e., different service profiles in the dataset), a robust log-likelihood sliding principal component-based feature extraction algorithm with an arbitrary-size sliding window extracts computationally efficient features from the processed samples. Finally, a dynamic reward reinforcement learning-based IIoT attack detection model controls the error rate of the design: a dynamic reward function and an incident repository not only generate the reward in an arbitrary fashion but also store the action results for the next round of training, thereby reducing the attack detection error rate. An IIoT attack detection system based on SPC-DRRL is then constructed and verified on the ToN_IoT dataset of the University of New South Wales, Australia. The experimental results show that IIoT attack detection time, overhead, and error rate are reduced considerably, with higher accuracy than traditional reinforcement learning methods.
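To make the preprocessing and feature-extraction steps concrete, the sketch below min-max normalizes TON_IoT-style numeric features and computes per-window PCA features with a sliding window. It is an approximation for illustration only; the paper's log-likelihood sliding principal component algorithm and the DRRL detector are not reproduced, and the file name, window size, and component count are assumptions.

```python
# Hypothetical sketch: min-max scaling plus sliding-window PCA features for IIoT traffic.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

df = pd.read_csv("ton_iot_network.csv")                 # assumed export of the dataset
X = df.select_dtypes(include=[np.number]).values

# Min-max normalization puts every feature on the same [0, 1] scale.
X = MinMaxScaler().fit_transform(X)

def sliding_pca_features(data, window=256, step=128, n_components=5):
    """Project each window onto its leading principal components and keep the
    explained-variance profile as a compact per-window feature vector."""
    feats = []
    for start in range(0, len(data) - window + 1, step):
        block = data[start:start + window]
        pca = PCA(n_components=n_components).fit(block)
        feats.append(pca.explained_variance_ratio_)
    return np.vstack(feats)

features = sliding_pca_features(X)
print("windowed feature matrix:", features.shape)
```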

11.
Sci Rep ; 13(1): 15909, 2023 Sep 23.
Article in English | MEDLINE | ID: mdl-37741875

ABSTRACT

The primary objective of this study is to delve into the application and validation of the Resistance Capacitance Optimization Algorithm (RCOA)-a new, physics-inspired metaheuristic optimization algorithm. The RCOA, intriguingly inspired by the time response of a resistance-capacitance circuit to a sudden voltage fluctuation, has been earmarked for solving complex numerical and engineering design optimization problems. Uniquely, the RCOA operates without any control/tunable parameters. In the first phase of this study, we evaluated the RCOA's credibility and functionality by deploying it on a set of 23 benchmark test functions. This was followed by thoroughly examining its application in eight distinct constrained engineering design optimization scenarios. This methodical approach was undertaken to dissect and understand the algorithm's exploration and exploitation phases, leveraging standard benchmark functions as the yardstick. The principal findings underline the significant effectiveness of the RCOA, especially when contrasted against various state-of-the-art algorithms in the field. Beyond its apparent superiority, the RCOA was put through rigorous statistical non-parametric testing, further endorsing its reliability as an innovative tool for handling complex engineering design problems. The conclusion of this research underscores the RCOA's strong performance in terms of reliability and precision, particularly in tackling constrained engineering design optimization challenges. This statement, derived from the systematic study, strengthens RCOA's position as a potentially transformative tool in the mathematical optimization landscape. It also paves the way for further exploration and adaptation of physics-inspired algorithms in the broader realm of optimization problems.

12.
Front Bioeng Biotechnol ; 11: 1211143, 2023.
Article in English | MEDLINE | ID: mdl-37397968

ABSTRACT

Purpose: In the contemporary era, a significant number of individuals encounter various health issues, including digestive system ailments, even during their advanced years. The major purpose of this study is to make certain observations of the internal digestive system in order to prevent the severe conditions that usually occur in elderly people. Approach: To this end, the proposed system is introduced with advanced features and a parametric monitoring system based on wireless sensor setups. The parametric monitoring system is integrated with a neural network in which certain control actions are taken to prevent gastrointestinal problems while reducing data loss. Results: The outcome of the combined process is examined across four different cases designed from an analytical model in which control parameters and weight assignments are also determined. Because the internal digestive system is monitored continuously, the data loss inherent to the wireless sensor network must be reduced, and the proposed approach limits such loss to an optimized value of 1.39%. Conclusion: Parametric cases were conducted to evaluate the efficacy of the neural network, and the findings indicate a significantly higher effectiveness rate of approximately 68% compared to the control cases.

13.
PeerJ Comput Sci ; 9: e1308, 2023.
Article in English | MEDLINE | ID: mdl-37346706

ABSTRACT

In the medical domain, wearables routinely capture and manage specific data points such as resting heart rate, ECG voltage, SpO2, sleep patterns (length, interruptions, and intensity), and physical activity (kind, duration, and level). These digital biomarkers are created mainly through passive data collection from various sensors. The critical issues with this method are time and sensitivity. To address them, we reviewed the newest wireless communication trends employed in hospitals that use wearable technology, together with privacy mechanisms and blockchain. Based on sensors, this wireless technology controls data gathered from numerous locations, and in this study the wearable sensors contain data from the various departments of the system. A gradient boosting method and a hybrid microwave transmission method are proposed to determine location and support users: the patient health decision is submitted to hybrid microwave transmission using gradient boosting, which helps trace mobile phones from calls made by a threatening person while the relevant data is gathered from the database during tracing. The data analysis process is based on decision-making: the data is fitted in a statistical model of the system to produce exploratory data analysis that satisfies the data drawn from the database. The complete data is classified with a 97% outcome, and removing unwanted data raises this to a 98% successful classification.

15.
Sci Rep ; 13(1): 7107, 2023 05 02.
Article in English | MEDLINE | ID: mdl-37131047

ABSTRACT

Researchers have long been interested in healthcare cybersecurity, since it can improve the security of patient and health-record data, and much work focuses on the safe exchange of health data between patients and the medical setting. Existing approaches still suffer from high computational complexity, increased time consumption, and cost complexity, all of which affect the effectiveness and performance of the overall security system. This work therefore proposes a technique called Consultative Transaction Key Generation and Management (CTKGM) to enable secure data sharing in healthcare systems. It generates a unique key pair from random values with multiplicative operations and timestamps, and patient data is then safely stored in discrete blocks of hash values using the blockchain methodology. The Quantum Trust Reconciliation Agreement Model (QTRAM), which calculates a trust score from feedback data, ensures reliable and secure data transfer. By allowing safe communication between patients and the healthcare system based on feedback analysis and trust values, the proposed framework makes a novel contribution to the field. Additionally, the Tuna Swarm Optimization (TSO) method is employed during communication to validate nonce verification messages; nonce message verification is the part of QTRAM that verifies users during transmission. The effectiveness of the scheme is demonstrated by analysing a variety of evaluation metrics and comparing the results with other current state-of-the-art models.
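As a minimal picture of the blockchain storage step only, the sketch below stores patient records as hash-linked blocks and verifies the chain. The CTKGM key generation, QTRAM trust scoring, and TSO nonce validation are not reproduced, and the record fields are illustrative assumptions.

```python
# Hypothetical sketch: hash-linked storage of patient records, in the spirit of the
# blockchain step described above. Field names are assumptions.
import hashlib
import json
import time

class RecordChain:
    def __init__(self):
        self.blocks = [{"index": 0, "prev_hash": "0" * 64, "data": "genesis"}]
        self.blocks[0]["hash"] = self._digest(self.blocks[0])

    @staticmethod
    def _digest(block):
        payload = json.dumps({k: v for k, v in block.items() if k != "hash"}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def add_record(self, data):
        block = {"index": len(self.blocks), "prev_hash": self.blocks[-1]["hash"], "data": data}
        block["hash"] = self._digest(block)
        self.blocks.append(block)

    def verify(self):
        # Every block must hash to its stored digest and point at its predecessor.
        return all(
            b["hash"] == self._digest(b) and b["prev_hash"] == self.blocks[i - 1]["hash"]
            for i, b in enumerate(self.blocks) if i > 0
        )

chain = RecordChain()
chain.add_record({"patient_id": "P001", "vitals": {"hr": 72}, "ts": time.time()})
print("chain valid:", chain.verify())
```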


Assuntos
Blockchain , Humanos , Confiança , Registros Eletrônicos de Saúde , Segurança Computacional , Atenção à Saúde
16.
J Cloud Comput (Heidelb) ; 12(1): 38, 2023.
Article in English | MEDLINE | ID: mdl-36937654

ABSTRACT

The Industrial Internet of Things (IIoT) promises to deliver innovative business models across multiple domains by providing ubiquitous connectivity, intelligent data, predictive analytics, and decision-making systems for improved market performance. However, traditional IIoT architectures are highly susceptible to security vulnerabilities and network intrusions, which raise challenges such as lack of privacy, integrity, and trust, and over-centralization. This research implements an Artificial Intelligence-based Lightweight Blockchain Security Model (AILBSM) to ensure the privacy and security of IIoT systems. The model addresses security and privacy issues that arise in cloud-based IIoT systems that handle data in the cloud or at the network edge (on-device). The novel contribution of this paper is that it combines the advantages of a lightweight blockchain with Convivial Optimized Sprinter Neural Network (COSNN) based AI mechanisms through simplified and improved security operations; the impact of attacks is reduced by transforming features into encoded data using an Authentic Intrinsic Analysis (AIA) model. Extensive experiments on various attack datasets validate the system, and the results of the privacy protection and AI mechanisms are evaluated separately and compared using various indicators. With the proposed AILBSM framework, execution time is minimized to 0.6 s, overall classification accuracy improves to 99.8%, and detection performance increases to 99.7%. Owing to the auto-encoder based transformation and blockchain authentication, the anomaly detection performance of the proposed model is substantially improved compared with other techniques.

17.
PeerJ Comput Sci ; 9: e1709, 2023.
Article in English | MEDLINE | ID: mdl-38192458

ABSTRACT

Using robotic technology to examine underwater systems remains a difficult undertaking because most automated activities lack network connectivity. The suggested approach therefore identifies the main gap in undersea systems and fills it using robotic automation. In the proposed model, an analytical framework is created to operate the robot within predetermined areas while maximizing communication range. Additionally, a clustering algorithm with a fuzzy membership function is implemented, allowing the robots to advance according to predefined clusters and return to their starting place within a predetermined amount of time. A cluster node is connected in each clustered region and provides the central control centre with the necessary data. Weights are evenly distributed, and the robotic system is installed so as to prevent an uncontrolled operational state. Five different scenarios are used to test and validate the model, and in each case the proposed method proves superior to the current methodology in terms of range, energy, density, time periods, and overall operational metrics.
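For the clustering step with a fuzzy membership function mentioned above, the sketch below runs a from-scratch fuzzy c-means over simulated unit positions; it is a generic stand-in, and the point cloud, cluster count, and fuzzifier value are assumptions.

```python
# Hypothetical sketch: fuzzy c-means clustering of simulated underwater-unit positions.
import numpy as np

def fuzzy_c_means(points, n_clusters=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        u = 1.0 / (dist ** (2 / (m - 1)))          # inverse-distance memberships
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Simulated 2-D positions of units inside three operating regions.
rng = np.random.default_rng(1)
points = np.vstack([rng.normal(loc, 0.5, size=(40, 2)) for loc in ([0, 0], [5, 5], [0, 6])])
centers, memberships = fuzzy_c_means(points)
print("cluster centres:\n", centers.round(2))
```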

18.
Diagnostics (Basel) ; 12(11)2022 Nov 10.
Article in English | MEDLINE | ID: mdl-36359592

ABSTRACT

To avoid dire situations, the medical sector must develop methods for quickly and accurately identifying infections in remote regions. The primary goal of the proposed work is to create a wearable device that uses the Internet of Things (IoT) to carry out several monitoring tasks. The designed wearable device is also operated within a multi-objective framework in order to decrease communication loss and the waiting time before detection while improving detection quality. A design method for wearable IoT devices is established that uses distinct mathematical approaches to solve these objectives, and the monitored parametric values are saved in a separate IoT application platform. Since the proposed study focuses on a multi-objective framework, state design and deep learning (DL) optimization techniques are combined, reducing the complexity of detection in the wearable technology. Although current methods also include wearable devices with IoT processes, their solutions cannot be reproduced using mathematical approaches and optimization strategies alone; the developed wearable gadgets can therefore be applied in real-time medical applications for fast remote monitoring of an individual. The proposed technique is tested in real time, and an IoT simulation tool is used to track the compared experimental results under five different situations; in all of the case studies examined, the planned method performs better than the current state-of-the-art methods.

19.
Diagnostics (Basel) ; 12(11)2022 Nov 17.
Article in English | MEDLINE | ID: mdl-36428903

ABSTRACT

Many people struggle with depression as a result of the coronavirus pandemic, which has adversely affected mental health without warning. Even though the majority of individuals are now protected, it is crucial to check for post-coronavirus symptoms if someone is feeling even a little lethargic, and the recommended approach is designed to identify the post-coronavirus symptoms and attacks present in the human body. When a harmful virus spreads inside the body, the post-diagnosis symptoms are considerably more dangerous, and if they are not recognised at an early stage the risks increase; if severe post-symptoms go untreated, they may also harm one's mental health. To prevent individuals from succumbing to depression, audio prediction technology is employed to recognise all symptoms and potentially dangerous signs. Different vocal characteristics are combined with machine-learning algorithms to determine each person's mental state. A separate device that captures audio attribute outputs is designed to evaluate the effectiveness of the suggested technique; compared with the previous method, the performance metric is better by roughly 67%.

20.
Sensors (Basel) ; 22(19)2022 Sep 21.
Article in English | MEDLINE | ID: mdl-36236264

ABSTRACT

Managing cloud infrastructure and the cloud platform involves many inherent issues: the cloud platform manages cloud software, the legal aspects of making contracts, cloud software services, and legal contract-based segmentation. In this paper, we tackle these issues directly with feasible solutions. For these constraints, the Averaged One-Dependence Estimators (AODE) classifier and the SELECT Applicable Only to Parallel Server (SELECT-APSL ASA) method are proposed to separate location-related data. The AODE classifier separates the smart city data based on a hybrid data obfuscation technique, in which 50% of the raw data is managed directly and 50% of the hospital data is masked using the proposed transmission. The analysis of energy consumption before the cryptosystem shows total packet delivery of about 71.66% compared with existing algorithms, and after the cryptosystem it shows 47.34% consumption compared with existing state-of-the-art algorithms. The average energy consumption decreases by 2.47% before data obfuscation and by 9.90% after data obfuscation. The makespan time decreases by 33.71% before data obfuscation and, compared with existing state-of-the-art algorithms, by 1.3% after data obfuscation. These results show the strength of our methodology.


Subjects
Algorithms; Cloud Computing; Software